In a world increasingly dominated by screens, our physical movements often feel at odds with the digital world. We pause our strides to type messages, halt our journeys to adjust music, and interrupt our walks to answer calls. Yet, what if the very act of walking itself could become a seamless and intuitive form of interaction? This research explores Gait Gestures, a new approach that transforms the subtle variations in our strides and foot strikes into an input language understood by augmented reality (AR) interfaces. In this project, we investigate a set of candidate gait gestures in terms of user experience and recognizability, follow up with a series of "walk"-throughs of common AR apps, and demonstrate technical feasibility with a gesture recognizer.
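To make the core idea concrete, here is a minimal, hypothetical sketch of how a foot-strike gesture might be turned into a discrete command from accelerometer data. The sample rate, thresholds, and the "double-tap" gesture are illustrative assumptions for this sketch, not the recognizer used in the project.

```python
# Toy sketch: detect a "double foot tap" from accelerometer magnitude.
# All constants below are assumed values for illustration only.
import numpy as np
from scipy.signal import find_peaks

FS = 100  # assumed accelerometer sample rate (Hz)

def detect_double_tap(accel_magnitude, fs=FS):
    """Return True if two sharp foot strikes occur within ~0.6 s of each other."""
    # Foot strikes appear as short spikes well above the baseline of gravity (~9.8 m/s^2).
    peaks, _ = find_peaks(accel_magnitude, height=14.0, distance=int(0.15 * fs))
    for first, second in zip(peaks, peaks[1:]):
        if (second - first) / fs < 0.6:
            return True
    return False

# Synthetic demo: gentle gait oscillation plus two close-together spikes.
t = np.arange(0, 3, 1 / FS)
signal = 9.8 + 0.5 * np.sin(2 * np.pi * 2 * t)  # quiet walking baseline
signal[120] += 8.0                               # first tap
signal[160] += 8.0                               # second tap, 0.4 s later
print(detect_double_tap(signal))                 # -> True
```

In a real pipeline the thresholds would be learned or calibrated per user, and the detected gesture would be mapped to an AR action (e.g., toggling a playlist), but the shape of the problem is the same: segment the gait signal, pick out the deliberate deviation, and translate it into a command.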
I like to take a walk. It's my go-to way to decompress after a long day and an essential part of my creative thinking process (related reading: Give Your Ideas Some Legs: The Positive Effect of Walking on Creative Thinking). Beyond that, this project is inspired by fantastical tales where a character's steps conjure magical spells or reveal hidden paths. Just as a princess's twirl might summon a shimmering gown or a hero's leap might ignite a blazing sword, what if a well-placed foot tap could activate a music playlist, or an elongated stride could adjust the volume?